55 research outputs found

    Mekanisma Kawalan Capaian Komputer Mikro di Makmal Komputer Berasaskan Kad Pintar (A Smart Card-Based Microcomputer Access Control Mechanism for Computer Labs)

    In general, this report covers explanations of four (4) main topics: i) the control of usage and PC booking system currently implemented at the labs; ii) the basic requirements of the computer security system and its implementation; iii) the proposed microcomputer access control mechanism in the lab (SisKAWAL); and iv) the development of SisKAWAL. Chapter 1 states the problems, the objectives, the scope and the project significance. Chapter 2 details the methodology as well as the hardware and software used in the system development. Chapter 3 discusses the current system being implemented to manage the use of PCs in the lab. It also details the implementation of the current control of usage and PC booking system. The explanations include the rules and procedures being used, the scope of the system, and the functions and responsibilities of the lab administrators. Chapter 4 details the system development as well as the proposed control mechanism. This chapter also discusses the basic security requirements and the security features, which are explained from a theoretical point of view and from research done by various parties. Chapter 5 explains the system design covering the physical design, input, output, database, interface, objects and algorithms, based on the technique and methodology of the Object Modelling Technique (OMT). Chapter 6 explains the implementation process of the system development used in the programming routines as well as the testing process on the system functions. Apart from that, the conclusions for the whole project, such as its advantages, disadvantages, limitations and proposals to upgrade the system performance, are discussed at the end of this report.

    A Goal and Ontology Based Approach for Generating ETL Process Specifications

    Data warehouse (DW) systems development involves several tasks such as defining requirements, designing DW schemas, and specifying data transformation operations. Indeed, the success of DW systems is very much dependent on the proper design of the extracting, transforming, and loading (ETL) processes. However, the common design-related problems in the ETL processes, such as defining user requirements and data transformation specifications, are far from being resolved. These problems are due to data heterogeneity in data sources, ambiguity of user requirements, and the complexity of data transformation activities. Current approaches have limitations in reconciling DW requirement semantics when designing the ETL processes. As a result, this has prolonged the generation of ETL process specifications. The semantic framework of DW systems established in this study is used to develop the requirement analysis method for designing the ETL processes (RAMEPs) from the different perspectives of the organization, decision-maker, and developer by using goal and ontology approaches. The correctness of the RAMEPs approach was validated by using modified and newly developed compliant tools. RAMEPs was evaluated in three real case studies, i.e., Student Affairs System, Gas Utility System, and Graduate Entrepreneur System. These case studies were used to illustrate how the RAMEPs approach can be implemented for designing and generating the ETL process specifications. Moreover, the RAMEPs approach was reviewed by DW experts to assess the strengths and weaknesses of the method, and the new approach was accepted. The RAMEPs method proves that the ETL process specifications can be derived from the early phases of DW systems development by using the goal-ontology approach.
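    The extract, transform, and load stages named above can be illustrated with a minimal sketch. This is not RAMEPs itself; the record fields, filtering rule, and conversion below are invented for illustration only.

    ```python
    # Minimal ETL sketch: extract rows, transform (filter + convert), load into a target.
    # All field names and rules here are illustrative, not taken from RAMEPs.

    def extract(source):
        """Pull raw records from a source (here: an in-memory list)."""
        return list(source)

    def transform(records):
        """Filter out incomplete rows and normalise the 'name' field."""
        out = []
        for r in records:
            if r.get("id") is None:  # filtering step: drop rows with no key
                continue
            out.append({"id": r["id"],
                        "name": r["name"].strip().title()})  # conversion step
        return out

    def load(records, target):
        """Append cleaned records to the target store."""
        target.extend(records)
        return target

    source = [{"id": 1, "name": "  alice "}, {"id": None, "name": "bob"}]
    warehouse = load(transform(extract(source)), [])
    # warehouse now holds the single cleaned record for id 1
    ```

    In a real ETL specification, each of these three functions would be derived from the documented requirements rather than hard-coded.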

    Data warehouse design for mobile environment

    Analysis and design play very important roles in Data Warehouse (DW) system development and form the backbone of the success or failure of any DW project. The emerging trend of ubiquitous applications requires DW systems to be implemented in mobile environments. However, current analysis and design approaches are based on existing DW environments that focus on deploying DW systems in static client-based applications. This limits user access and reduces the use of analytical information by decision makers. Consequently, it prolongs the adoption of business intelligence (BI) applications by users and organisations. This research suggests an approach for designing the DW and implementing the DW system in mobile environments. A variant dimension modelling technique will be used to enhance DW modelling in order to accommodate the requirements of mobile characteristics in the DW design.

    Al-Quran ontology based on knowledge themes

    Islamic knowledge is gathered through understanding the Al-Quran. This requires an ontology which can capture the knowledge and present it in a machine-readable structure. However, current ontology approaches are irrelevant and inaccurate in producing true concepts of Al-Quran knowledge, because they use traditional methods that only define the concepts of knowledge without connecting them to a related theme of knowledge. The themes of knowledge are important to provide the true meaning and explanation of Al-Quran knowledge classification. The main aims of this paper are to demonstrate the development of the Al-Quran ontology and the method used for searching Al-Quran knowledge using a semantic search approach. Expert review has been applied to validate the ontology model and to evaluate the relevance and precision of the search results.
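    The idea of linking concepts to knowledge themes, so that search answers come from theme membership rather than keyword matching alone, can be sketched in a few lines. The concepts, themes, and verse references below are placeholder data, not the paper's actual ontology.

    ```python
    # Toy theme-based lookup: each concept is linked to a knowledge theme,
    # and a query is answered by gathering all concepts under that theme.
    # The entries below are illustrative placeholders, not the paper's ontology.

    ontology = {
        "charity": {"theme": "social duties", "verses": ["2:261", "2:271"]},
        "prayer":  {"theme": "worship",       "verses": ["2:43", "29:45"]},
        "fasting": {"theme": "worship",       "verses": ["2:183"]},
    }

    def search_by_theme(theme):
        """Return concepts whose theme matches, mimicking a theme-aware search."""
        return {c: v["verses"] for c, v in ontology.items() if v["theme"] == theme}

    results = search_by_theme("worship")  # both 'prayer' and 'fasting' are returned
    ```

    A keyword search for "worship" over raw text would miss both concepts here; the theme link is what connects them, which is the point the abstract makes about traditional methods.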

    A comparison of GIS packages for geospatial data pre-processing

    This paper aims to assess the choice of geographic information systems (GIS) used to pre-process geospatial datasets. The study conducted a comparative review of some of the commonly used GIS packages with the aim of proposing the most reliable in terms of consistency, functionality, user-friendliness and cost-effectiveness, which are the determinants in adopting any GIS package. Through this systematic assessment, both current and potential users will be able to take full advantage of the most efficient GIS package to perform various analytical pre-processing tasks. The outcome of the assessment could be adopted as a guide for selecting an appropriate and reliable open source GIS platform for timely and efficient pre-processing of geospatial data for environmental analysis.

    Framework for a semantic data transformation in solving data quality issues in big data

    Purpose - Today organizations and companies are generating a tremendous amount of data. At the same time, an enormous amount of data is being received and acquired from various resources and stored, which brings us to the era of Big Data (BD). BD is a term used to describe massive datasets of diverse formats created at very high speed, the management of which is near impossible using traditional database management systems (Kanchi et al., 2015). With the dawn of BD, Data Quality (DQ) has become very imperative. Volume, velocity and variety, the initial 3V characteristics of BD, are usually used to describe its main properties. But to extract value (another V property) and make BD effective and efficient for organizational decision making, the significance of yet another V of BD, veracity, is gradually coming to light. Veracity directly denotes inconsistency and DQ issues. Today, veracity in data analysis is the biggest challenge when compared to other aspects such as volume and velocity. Trusting the data acquired goes a long way in implementing decisions from an automated decision-making system, and veracity helps to validate the data acquired (Agarwal, Ravikumar, & Saha, 2016). DQ represents an important issue in every business. To be successful, companies need high-quality data on inventory, supplies, customers, vendors and other vital enterprise information in order to run their data analysis applications efficiently (e.g. decision support systems, data mining, customer relationship management) and produce accurate results (McAfee & Brynjolfsson, 2012). During the transformation of huge volumes of data, there may be data mismatches, miscalculations and/or loss of useful data, leading to an unsuccessful data transformation (Tesfagiorgish & JunYi, 2015), which in turn leads to poor data quality.
In addition, external data, particularly RDF data, introduces further challenges for data transformation when compared with the traditional transformation process. For example, a drawback of using BD in the business analysis process is that the data is almost schema-less, and RDF data contains poor or complex schemas. Traditional data transformation tools are not able to process such inconsistent and heterogeneous data because they do not support semantic-aware data, they are entirely schema-dependent, and they do not focus on expressive semantic relationships to integrate data from different sources. Thus, BD requires more powerful tools to transform data semantically. While research in this area so far offers different frameworks, to the best of the researchers' knowledge, not much research has been done on the transformation of DQ in BD. The little that has been done has not gone beyond cleansing incoming data generally (Merino et al., 2016). The proposed framework presents a method for the analysis of DQ using BD from various domains, applying semantic technologies in the ETL transformation stage to create a semantic model for the enablement of quality in the data.
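The core idea of a semantic-aware transformation step, mapping records from heterogeneous, schema-less sources onto one shared vocabulary before loading, can be sketched as follows. The vocabulary, source names, and field mappings are hypothetical, not the framework's actual model.

```python
# Sketch of a semantic-aware transformation: records from two sources with
# different local schemas are renamed onto one shared vocabulary.
# All names here are hypothetical illustrations.

# Per-source mapping from local field names to the shared vocabulary.
MAPPINGS = {
    "crm":  {"name": "customer_name", "mail": "customer_email"},
    "shop": {"fullName": "customer_name", "email": "customer_email"},
}

def to_shared(record, source):
    """Rename a record's fields to the shared vocabulary; drop unmapped fields."""
    mapping = MAPPINGS[source]
    return {mapping[k]: v for k, v in record.items() if k in mapping}

# The same customer from two schema-incompatible sources now compares equal.
r1 = to_shared({"name": "Ann", "mail": "a@x.com", "age": 30}, "crm")
r2 = to_shared({"fullName": "Ann", "email": "a@x.com"}, "shop")
```

A schema-dependent tool would treat `name` and `fullName` as unrelated columns; the shared-vocabulary mapping is the minimal form of the semantic relationship the abstract says traditional tools lack.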

    Cognitive approach using SFL theory in capturing tacit knowledge in business intelligence

    The complexity of Business Intelligence (BI) processes needs to be explored in order to ensure that a BI system properly treats tacit knowledge as part of the data sources in the BI framework. Therefore, a new approach to handling tacit knowledge in BI systems still needs to be developed. The library is an ideal place to gather tacit knowledge. It is a place full of explicit knowledge stored on various bookshelves. Nevertheless, tacit knowledge is very abundant in the heads of the librarians. The explicit knowledge they gained from education in the field of libraries and information is not sufficient to deal with a complex and contextual work environment. Complexity comes from the many interconnected affairs that connect librarians with the surrounding environment, such as supra-organizations, employees, the physical environment, and library users. This knowledge is contextual because there are various types of libraries and different types of library users who demand different management. Since tacit knowledge is hard to capture, we need to use all possible sources of externalization of tacit knowledge. The effort to capture this knowledge is done through a social process where the transfer of knowledge takes place from an expert to an interviewer. For this reason, it is important for the interview process to be based on Systemic Functional Linguistics (SFL) theory.

    ETL processes specifications generation through goal-ontology approach

    The common design-related problems for extract, transform, load (ETL) processes are far from being resolved, due to the variation and ambiguity of user requirements and the complexity of ETL operations. These are fundamental issues of data conflicts in heterogeneous information-sharing environments. Current approaches are based on existing software requirement methods that have limitations in reconciling user semantics toward the modeling of the DW. This prolongs the generation of the ETL process specifications. The solution in this paper focuses on a requirement analysis method for designing the ETL processes. The method, RAMEPs (Requirement Analysis Method for ETL Processes), was developed to support the design of ETL processes by analyzing and producing the DW requirements from the perspectives of the organization, decision-makers, and developers. The ETL processes are modeled and designed by capturing DW schemas and data source integration and transformation. The validation of RAMEPs emphasizes the correctness of the goal-oriented and ontology requirement models, and was performed using compliant tools that support both approaches. The correctness of RAMEPs was evaluated in three real case studies: Student Affairs System, Gas Utility System, and Graduate Entrepreneur System. These case studies were used to illustrate how the RAMEPs method was implemented in generating the ETL process specifications.

    The implementation of student automation evaluation system using SAS/IntrNet

    The Academic and Student Information System (ASIS) was developed by Universiti Utara Malaysia (UUM) to provide information and facilities to process students' results submitted by UUM lecturers. These functionalities were part of the whole student information system currently implemented in UUM. However, a constraint arose because the current system was unable to provide a facility to assist lecturers in managing scores and evaluating students' performance through assignments, quizzes, tests, projects and the final examination. Thus, we developed a system to provide these facilities and complement the current system, ASIS. The system was developed using PowerDynamo as a front-end and SAS/IntrNet® to enrich the output and enhance the interface functionality. The system allows lecturers to create a temporary workspace to input and edit the fields mentioned. The number of fields and the percentage of each field used to store scores on students' assessment are determined by the lecturers themselves. The system calculates the contribution of each field and stores it as a total coursework score. With SAS/IntrNet®, the students' grades were easily evaluated before the final scores were submitted for result processing. Finally, the total coursework and final examination scores are submitted online to ASIS when instructed by the lecturers, ending the student examination process. Furthermore, the system has successfully supported the implementation of the Student Advisory System by providing useful information to lecturers (Mentors) to advise students (Mentees) more effectively, with the main purpose of helping UUM students to boost their academic performance.
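    The coursework calculation described above, lecturer-defined fields each carrying a percentage, summed into a total coursework score, amounts to a weighted sum. The sketch below illustrates it in Python (the original system used SAS/IntrNet); the field names and weights are invented examples, not UUM's actual scheme.

    ```python
    # Weighted coursework total: the lecturer chooses the fields and their
    # percentages, and the system sums each field's contribution.
    # Field names and weights are illustrative only.

    fields = {               # weight as a fraction of the coursework total
        "assignments": 0.30,
        "quizzes":     0.10,
        "tests":       0.40,
        "project":     0.20,
    }

    def coursework_total(scores, weights=fields):
        """Scores are out of 100 per field; return the weighted total."""
        return sum(scores[f] * w for f, w in weights.items())

    total = coursework_total(
        {"assignments": 80, "quizzes": 90, "tests": 70, "project": 85})
    # 0.30*80 + 0.10*90 + 0.40*70 + 0.20*85 = 24 + 9 + 28 + 17 = 78
    ```

    The final grade would then combine this coursework total with the final examination score under whatever split the course uses.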

    Requirements analysis method for extracting-transformation-loading (ETL) in data warehouse systems

    The data warehouse (DW) system design involves several tasks, such as defining the DW schemas and the ETL process specifications, and these have been extensively studied and practiced for many years. The problems in heterogeneous data integration are still far from being resolved due to the complexity of ETL processes and the fundamental problems of data conflicts in information-sharing environments. Understanding the early phase of DW development is essential in properly tackling the complexity of ETL processes. A method to analyze the DW requirements from the abstract level (e.g. goal, sub-goal, stakeholder, dependency) toward the specification of ETL processes (e.g. extracting, filtering, conversion) is important in order to manage the complexity of the ETL process design (e.g. semantic heterogeneity problems). However, current approaches that are based on existing software requirement approaches still have limitations in translating the business semantics of DW requirements into the ETL process specifications. Moreover, understanding goals from the perspectives of the organization and decision makers is important to ensure that the semantics of DW requirements can be properly determined, organized, and implemented by the ETL processes. Therefore, the proposed method utilizes an ontology with a goal-driven approach in analyzing the requirements of ETL processes.